fix: always include temperature parameter for OpenAI compatible providers #7582
Conversation
fix: always include temperature parameter for OpenAI compatible providers

- Modified base-openai-compatible-provider.ts to always include temperature with fallback to defaultTemperature
- Modified openai.ts to always include temperature with fallback to 0 (or DEEP_SEEK_DEFAULT_TEMPERATURE for DeepSeek reasoner models)
- Updated tests to expect temperature to always be included with appropriate defaults
- This fixes TabbyApi/ExLlamaV2 crashes when temperature is undefined/null

Fixes #7581
I reviewed my own code and found issues I didn't know I created.
```diff
- params.temperature = this.options.modelTemperature
+ // Always include temperature to prevent TabbyApi/ExLlamaV2 crashes
+ // Use explicitly set temperature, or fall back to defaultTemperature (which defaults to 0)
+ temperature: this.options.modelTemperature ?? this.defaultTemperature,
```
Good implementation of the temperature fallback chain! The comment clearly explains why this is needed. However, I noticed the completePrompt method in this same file (lines 117-134) still doesn't include the temperature parameter - this could still cause TabbyApi/ExLlamaV2 crashes when using the completion endpoint.
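For reference, a minimal sketch of the kind of follow-up that would close that gap. The method signature and client call shape here are assumptions modeled on the other diffs in this PR, not code copied from the file:

```ts
// Hypothetical sketch only - the real completePrompt may be shaped differently.
async completePrompt(prompt: string): Promise<string> {
	const response = await this.client.chat.completions.create({
		model: this.getModel().id,
		messages: [{ role: "user", content: prompt }],
		// Mirror the streaming path: always send a numeric temperature
		temperature: this.options.modelTemperature ?? this.defaultTemperature,
	})
	return response.choices[0]?.message.content ?? ""
}
```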
```diff
- requestOptions.temperature = DEEP_SEEK_DEFAULT_TEMPERATURE
+ // Always include temperature to prevent TabbyApi/ExLlamaV2 crashes
+ // Use explicitly set temperature, or DeepSeek default for reasoner models, or fall back to 0
+ temperature: this.options.modelTemperature ?? (deepseekReasoner ? DEEP_SEEK_DEFAULT_TEMPERATURE : 0),
```
Good fix for the streaming case! This ensures temperature is always included with appropriate defaults.
```diff
  : [systemMessage, ...convertToOpenAiMessages(messages)],
+ // Always include temperature to prevent TabbyApi/ExLlamaV2 crashes
+ // Use explicitly set temperature, or DeepSeek default for reasoner models, or fall back to 0
+ temperature: this.options.modelTemperature ?? (deepseekReasoner ? DEEP_SEEK_DEFAULT_TEMPERATURE : 0),
```
Excellent - non-streaming mode also properly includes temperature now. The fallback chain (user setting -> DeepSeek default -> 0) is well thought out.
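One detail worth spelling out: the chain uses nullish coalescing, so a user who deliberately sets temperature to 0 keeps it rather than being bumped to a default. A standalone illustration (the constant's value of 0.6 is taken from the summary below):

```ts
const DEEP_SEEK_DEFAULT_TEMPERATURE = 0.6

const userTemp: number | undefined = 0 // user explicitly chose 0

// ?? only falls through on null/undefined, so the explicit 0 survives:
console.log(userTemp ?? DEEP_SEEK_DEFAULT_TEMPERATURE) // 0

// || would treat 0 as falsy and silently override it:
console.log(userTemp || DEEP_SEEK_DEFAULT_TEMPERATURE) // 0.6
```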
```diff
  model: model.id,
  messages: [{ role: "user", content: prompt }],
+ // Always include temperature to prevent TabbyApi/ExLlamaV2 crashes
+ temperature: this.options.modelTemperature ?? 0,
```
Good catch on the completePrompt method! This ensures consistency across all API calls.
Closing in favor of #7594
Summary
This PR fixes the TabbyApi/ExLlamaV2 crash issue introduced after v3.25.20 where the temperature parameter was being passed as `None`/`undefined` instead of a valid float value.

Problem
After commit 090737c (#7188), the temperature parameter was omitted when not explicitly set, to allow backend services to use their configured defaults. However, some backends like TabbyApi/ExLlamaV2 expect a temperature value and crash when receiving `None`.
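For context on why omission goes wrong here (a standalone illustration, not the repo's serialization code): `JSON.stringify` drops `undefined`-valued keys entirely, so the backend never receives a temperature and presumably substitutes its own `None`, while an explicit `null` is transmitted verbatim; either way, a backend expecting a float fails.

```ts
// An undefined value disappears from the request body entirely...
console.log(JSON.stringify({ model: "m", temperature: undefined }))
// -> {"model":"m"}

// ...while null is sent as an explicit non-float value:
console.log(JSON.stringify({ model: "m", temperature: null }))
// -> {"model":"m","temperature":null}
```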
Solution

- Modified `base-openai-compatible-provider.ts` to always include temperature with fallback to `defaultTemperature`
- Modified `openai.ts` to always include temperature with fallback to 0 (or `DEEP_SEEK_DEFAULT_TEMPERATURE` for DeepSeek reasoner models)
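A minimal standalone sketch of the combined fallback logic; the constant name follows the diff, while its value (0.6, per the summary below) and the helper's shape are illustrative rather than the repo's code:

```ts
const DEEP_SEEK_DEFAULT_TEMPERATURE = 0.6

function resolveTemperature(
	modelTemperature: number | null | undefined,
	deepseekReasoner: boolean,
): number {
	return modelTemperature ?? (deepseekReasoner ? DEEP_SEEK_DEFAULT_TEMPERATURE : 0)
}

// The request body now always carries a numeric temperature:
const body = {
	model: "deepseek-reasoner",
	temperature: resolveTemperature(undefined, true), // 0.6
}
```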
Testing

Impact
Fixes #7581
Important
Ensure temperature parameter is always included with defaults in OpenAI-compatible providers to prevent crashes.
- Always include temperature in `base-openai-compatible-provider.ts` and `openai.ts` to prevent crashes in TabbyApi/ExLlamaV2.
- Fall back to `defaultTemperature` or provider-specific defaults (OpenAI: 0, Groq: 0.5, Roo: 0.7, DeepSeek Reasoner: 0.6) if not explicitly set.
- Updated `groq.spec.ts`, `openai.spec.ts`, and `roo.spec.ts` to verify temperature inclusion with appropriate defaults.
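Not the repo's spec code: a self-contained sketch, using plain node:assert instead of the jest specs named above, of the behavior those tests are described as verifying. The lookup table restates the defaults listed above, and resolving by provider name is an assumption for illustration.

```ts
import assert from "node:assert"

// Provider defaults as listed in the summary above
const defaults: Record<string, number> = {
	openai: 0,
	groq: 0.5,
	roo: 0.7,
	"deepseek-reasoner": 0.6,
}

const resolve = (provider: string, userTemp?: number | null) =>
	userTemp ?? defaults[provider] ?? 0

assert.strictEqual(resolve("groq"), 0.5) // provider default when unset
assert.strictEqual(resolve("openai", 0), 0) // explicit 0 preserved
assert.strictEqual(resolve("deepseek-reasoner", null), 0.6) // null treated as unset
```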